Towards Query-Efficient Black-Box Adversary with Zeroth-Order Natural Gradient Descent


Similar articles

Query-Efficient Black-box Adversarial Examples

Current neural network-based image classifiers are susceptible to adversarial examples, even in the black-box setting, where the attacker is limited to query access without access to gradients. Previous methods — substitute networks and coordinate-based finite-difference methods — are either unreliable or query-inefficient, making these methods impractical for certain problems. We introduce a n...
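The finite-difference approach mentioned above can be sketched concretely. The snippet below is not from the paper; it is a minimal numpy illustration of the standard random-direction zeroth-order gradient estimator that query-based black-box attacks build on, with a toy quadratic standing in for the classifier's loss (all names are illustrative).

```python
import numpy as np

def zo_gradient(f, x, num_dirs=2000, mu=1e-4, rng=None):
    """Estimate grad f(x) using only function queries:
    E[(f(x + mu*u) - f(x)) / mu * u] ~ grad f(x) for u ~ N(0, I).
    Each direction costs one extra query, so the number of directions
    is exactly the query budget that these attacks try to minimize."""
    rng = np.random.default_rng(rng)
    g = np.zeros_like(x)
    fx = f(x)
    for _ in range(num_dirs):
        u = rng.standard_normal(x.shape)
        g += (f(x + mu * u) - fx) / mu * u
    return g / num_dirs

# Toy loss standing in for the black-box classifier's score.
f = lambda x: 0.5 * np.sum(x ** 2)
x = np.array([1.0, -2.0, 3.0])
g_est = zo_gradient(f, x, rng=0)
# For this quadratic the true gradient is x itself.
```

The estimator is unbiased up to O(mu) smoothing error, but its variance grows with dimension, which is why naive coordinate-wise or random-direction schemes become query-inefficient on high-dimensional images.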


Energetic Natural Gradient Descent

In this appendix we show that $\frac{1}{2}\Delta^\top F(\theta)\Delta$ is a second-order Taylor approximation of $D_{\mathrm{KL}}(p(\theta)\,\|\,p(\theta+\Delta))$. First, let $g_q(\theta) := D_{\mathrm{KL}}(q\,\|\,p(\theta)) = \sum_{\omega\in\Omega} q(\omega)\ln\frac{q(\omega)}{p(\omega|\theta)}$. We begin by deriving equations for the Jacobian and Hessian of $g_q$ at $\theta$:
$$\frac{\partial g_q(\theta)}{\partial\theta} = \sum_{\omega\in\Omega} q(\omega)\,\frac{p(\omega|\theta)}{q(\omega)}\,\frac{\partial}{\partial\theta}\frac{q(\omega)}{p(\omega|\theta)} = \sum_{\omega\in\Omega} q(\omega)\,\frac{p(\omega|\theta)}{q(\omega)}\cdot\frac{-q(\omega)\,\frac{\partial p(\omega|\theta)}{\partial\theta}}{p(\omega|\theta)^2} = \sum_{\omega\in\Omega} -\frac{q(\omega)}{p(\omega|\theta)}\frac{\partial p(\omega|\theta)}{\partial\theta}, \qquad (4)$$
and so: $\frac{\partial^2 g_q(\theta)}{\partial\theta^2}$ ...
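The claimed second-order approximation can be checked numerically. The following sketch is not from the paper: it uses a small softmax-parameterized categorical distribution (for which the Fisher matrix has the closed form $\mathrm{diag}(p) - pp^\top$) and verifies that $D_{\mathrm{KL}}(p(\theta)\,\|\,p(\theta+\Delta)) \approx \frac{1}{2}\Delta^\top F(\theta)\Delta$ for a small perturbation $\Delta$.

```python
import numpy as np

def softmax(theta):
    e = np.exp(theta - theta.max())
    return e / e.sum()

def kl(p, q):
    return np.sum(p * np.log(p / q))

def fisher(theta):
    # Fisher information of a softmax categorical model:
    # F = E[(d log p)(d log p)^T] = diag(p) - p p^T.
    p = softmax(theta)
    return np.diag(p) - np.outer(p, p)

theta = np.array([0.2, -0.5, 0.8])
delta = 1e-3 * np.array([1.0, -2.0, 0.5])

lhs = kl(softmax(theta), softmax(theta + delta))
rhs = 0.5 * delta @ fisher(theta) @ delta
# lhs and rhs agree up to third-order terms in delta.
```

Shrinking `delta` further makes the two sides agree to correspondingly higher relative precision, as the Taylor argument predicts.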


Energetic Natural Gradient Descent

We propose a new class of algorithms for minimizing or maximizing functions of parametric probabilistic models. These new algorithms are natural gradient algorithms that leverage more information than prior methods by using a new metric tensor in place of the commonly used Fisher information matrix. This new metric tensor is derived by computing directions of steepest ascent where the distance ...
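For context, a natural gradient update preconditions the ordinary gradient by the inverse of a metric tensor, classically the Fisher information matrix; the paper above replaces that tensor with a new one. This is a minimal sketch of the classical Fisher-preconditioned update (not the paper's energetic variant), fitting a softmax categorical to a target distribution; the damping term and learning rate are illustrative choices.

```python
import numpy as np

def softmax(theta):
    e = np.exp(theta - theta.max())
    return e / e.sum()

def natural_gradient_step(theta, grad, fisher, lr=0.5, damping=1e-4):
    # Precondition by the (damped) metric tensor: steepest descent
    # measured in KL divergence rather than Euclidean distance.
    F = fisher + damping * np.eye(len(theta))
    return theta - lr * np.linalg.solve(F, grad)

# Toy use: fit a softmax-parameterized categorical to a target.
target = np.array([0.6, 0.3, 0.1])
theta = np.zeros(3)
for _ in range(200):
    p = softmax(theta)
    grad = p - target                      # gradient of KL(target || p_theta)
    fisher_mat = np.diag(p) - np.outer(p, p)  # Fisher of the softmax model
    theta = natural_gradient_step(theta, grad, fisher_mat)
```

Because the metric is the Hessian of the KL objective for this exponential-family model, the preconditioned step behaves like a Newton step and converges in far fewer iterations than plain gradient descent would.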


Efficiency Bounds for Adversary Constructions in Black-Box Reductions

We establish a framework for bounding the efficiency of cryptographic reductions in terms of their security transfer. While efficiency bounds for reductions have been studied for about ten years, the main focus has been the efficiency of the construction, mostly measured by the number of calls the constructed primitive makes to the basic primitive. Our work focuses on the efficiency of the wrap...


Intrinsic Plasticity via Natural Gradient Descent

This paper introduces the natural gradient for intrinsic plasticity, which tunes a neuron’s activation function such that its output distribution becomes exponentially distributed. The information-geometric properties of the intrinsic plasticity potential are analyzed and the improved learning dynamics when using the natural gradient are evaluated for a variety of input distributions. The appli...



Journal

Journal title: Proceedings of the AAAI Conference on Artificial Intelligence

Year: 2020

ISSN: 2374-3468,2159-5399

DOI: 10.1609/aaai.v34i04.6173